Regularized Compression of a Noisy Blurred Image
Authors
Abstract
Both regularization and compression are important issues in image processing and have been widely studied in the literature. The usual procedure for compressing an image that is available only as a noisy blurred observation requires two steps: first the image is deblurred, and then the regularized image is factorized to obtain an approximation in terms of low-rank nonnegative factors. We examine here the possibility of swapping the two steps, deblurring directly the noisy factors or partially denoised factors. The experiments show that images with a comparable regularized compression can be obtained in this way at a lower computational cost.
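A minimal sketch of the two orderings, assuming a periodic Gaussian blur, Tikhonov-regularized inversion in the Fourier domain, and scikit-learn's NMF for the nonnegative factorization; the blur model, regularization weight, and rank are illustrative choices, and for simplicity the swapped order deblurs the low-rank reconstruction rather than the individual factors as in the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import NMF

def gaussian_transfer(shape, sigma):
    """Fourier transfer function of a periodic Gaussian blur (illustrative PSF)."""
    delta = np.zeros(shape)
    delta[0, 0] = 1.0
    return np.fft.fft2(gaussian_filter(delta, sigma, mode='wrap'))

def tikhonov_deblur(g, H, lam):
    """Tikhonov-regularized inverse filter: F = conj(H) * G / (|H|^2 + lam)."""
    F = np.conj(H) * np.fft.fft2(g) / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(F))

rng = np.random.default_rng(0)
f = rng.random((64, 64))                        # stand-in for the true image
H = gaussian_transfer(f.shape, sigma=2.0)
g = np.real(np.fft.ifft2(H * np.fft.fft2(f)))   # blurred image
g += 0.01 * rng.standard_normal(g.shape)        # additive noise

rank, lam = 10, 1e-2
nmf = NMF(n_components=rank, init='nndsvda', max_iter=500)

# (A) Usual order: deblur the whole image, then factorize the regularized image.
f_reg = np.clip(tikhonov_deblur(g, H, lam), 0, None)
W_a = nmf.fit_transform(f_reg)
compressed_a = W_a @ nmf.components_

# (B) Swapped order: factorize the noisy blurred image first, then deblur the
#     low-rank reconstruction (the paper operates on the factors themselves).
W_b = nmf.fit_transform(np.clip(g, 0, None))
compressed_b = np.clip(tikhonov_deblur(W_b @ nmf.components_, H, lam), 0, None)
```

In the swapped order the factorization works on the data before any deblurring, which is where the lower computational cost reported in the abstract would come from.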
Similar Articles
Annotated Bibliography High-dimensional Statistical Inference
Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. This line of work shows that l1-regularized least squares regression can accurately estimate a sparse linear model from n noisy examples in p dimensions, even if p is much larger than n. In this paper we study a...
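A small illustration of the stated claim, assuming scikit-learn's Lasso as the l1-regularized least squares solver; the dimensions, sparsity level, and penalty weight are illustrative, not taken from the cited paper:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative dimensions: many more features than samples (p >> n).
n, p, k = 100, 1000, 5
rng = np.random.default_rng(0)

X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0                               # a k-sparse true linear model
y = X @ beta + 0.5 * rng.standard_normal(n)  # n noisy examples

# l1-regularized least squares (Lasso); alpha is an illustrative choice.
model = Lasso(alpha=0.1, max_iter=10_000).fit(X, y)
recovered = np.flatnonzero(model.coef_)
print("nonzero coefficients recovered at indices:", recovered)
```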
Full Text
Compressed Regression (arXiv:0706.0534v1 [stat.ML], 4 Jun 2007)
Full Text
Stability Analysis for Regularized Least Squares Regression
We discuss stability for a class of learning algorithms with respect to noisy labels. The algorithms we consider are for regression, and they involve the minimization of regularized risk functionals, such as L(f) := (1/N) ∑_{i=1}^{N} (f(x_i) − y_i)² + λ‖f‖²_H. We shall call the algorithm 'stable' if, when y_i is a noisy version of f(x_i) for some function f ∈ H, the output of the algorithm converges to f as the ...
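By the representer theorem, the minimizer of a regularized risk functional of this form has a closed-form kernel expression; a minimal sketch, assuming a Gaussian RBF kernel and illustrative choices of λ and data (none of which are taken from the cited paper):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
N = 50
x = rng.uniform(-1, 1, (N, 1))
f_true = lambda t: np.sin(3 * t).ravel()       # stand-in target function
y = f_true(x) + 0.1 * rng.standard_normal(N)   # noisy labels

lam = 1e-2                                     # regularization weight (illustrative)
K = rbf_kernel(x, x)
# Minimizer of (1/N) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2 has the form
# f(.) = sum_i alpha_i k(., x_i) with alpha = (K + N * lam * I)^{-1} y.
alpha = np.linalg.solve(K + N * lam * np.eye(N), y)

x_test = np.linspace(-1, 1, 200)[:, None]
f_hat = rbf_kernel(x_test, x) @ alpha          # regularized estimate of f
```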
Full Text
Numerical Differentiation for the Second Order Derivative of Functions with Several Variables
We propose a regularized optimization problem for computing the second-order derivative of a function of two variables from noisy values of the function at scattered points, and prove the existence and uniqueness of the solution of this problem. The reconstruction scheme is also given during the proof, which is based on biharmonic Green ...
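This is not the biharmonic-Green-function scheme of the cited paper; the sketch below only illustrates, in one variable, why a regularized fit is needed before differentiating noisy data twice, using SciPy's smoothing spline with an illustrative smoothing factor:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))         # scattered 1-D sample points
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)  # noisy function values

# Regularized fit: the smoothing factor s penalizes roughness, which keeps the
# second derivative from blowing up under the noise.
spline = UnivariateSpline(x, y, k=4, s=x.size * 0.05 ** 2)
d2 = spline.derivative(2)(x)                         # regularized estimate of f''
print(np.max(np.abs(d2 + np.sin(x))))                # error against the true f'' = -sin(x)
```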
Full Text
Regularized Autoregressive Multiple Frequency Estimation
The paper addresses the problem of tracking multiple frequencies using a Regularized Autoregressive (RAR) approximation. The RAR procedure reduces the approximation bias compared to other AR-based frequency detection methods, while still providing a competitive variance of the sample estimates. We show that the RAR estimates of multiple periodicities are consistent in probability...
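The snippet does not give the exact RAR penalty, so as a stand-in the sketch below fits an autoregressive model by ridge-regularized least squares and reads the frequency estimate off the roots of the AR polynomial; the model order, penalty weight, and test signal are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, f0 = 400, 0.12                        # sample count and true normalized frequency
t = np.arange(n)
x = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(n)

# Ridge-regularized least-squares fit of an AR(p) model x[k] ~ sum_j a_j * x[k-j];
# the l2 penalty stands in for the (unspecified) RAR regularization.
p, lam = 8, 1e-1
Y = x[p:]
X = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
a = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)

# Frequencies are read off the angles of the AR characteristic roots;
# the dominant root pair should lie near the true frequency f0 = 0.12.
roots = np.roots(np.concatenate(([1.0], -a)))
freqs = np.angle(roots) / (2 * np.pi)
print(sorted(f for f in freqs if f > 0))
```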
Full Text